Improved computation for Levenberg-Marquardt training

Authors

  • Bogdan M. Wilamowski
  • Hao Yu
Abstract

The improved computation presented in this paper is aimed at optimizing the learning process of neural networks with the Levenberg-Marquardt (LM) algorithm. The quasi-Hessian matrix and gradient vector are computed directly, without Jacobian matrix multiplication and storage, which removes the memory limitation of LM training. Because the quasi-Hessian matrix is symmetric, only the elements of its upper (or lower) triangular part need to be calculated. Training speed is therefore improved significantly, not only because a smaller array is stored in memory, but also because fewer operations are needed to build the quasi-Hessian matrix. The gains in memory and time efficiency are especially pronounced when training with large numbers of patterns.
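A minimal NumPy sketch of the pattern-by-pattern accumulation described in the abstract, assuming a sum-of-squares error; the function names, and the routine assumed to supply one Jacobian row j_p and error e_p per pattern/output, are illustrative rather than the authors' actual implementation:

```python
import numpy as np

def accumulate_quasi_hessian(jacobian_rows, errors, n_weights):
    """Accumulate the quasi-Hessian Q = sum_p j_p^T j_p and gradient g = sum_p j_p * e_p
    one pattern at a time, so the full Jacobian matrix is never formed or stored.
    Only the upper triangle of Q is accumulated; symmetry fills in the rest."""
    Q = np.zeros((n_weights, n_weights))
    g = np.zeros(n_weights)
    for j_p, e_p in zip(jacobian_rows, errors):      # j_p: (n_weights,), e_p: scalar
        # For brevity the full outer product is formed here; a real implementation
        # would compute only the n*(n+1)/2 upper-triangular elements directly.
        Q += np.triu(np.outer(j_p, j_p))
        g += j_p * e_p
    Q = Q + np.triu(Q, 1).T                          # mirror upper triangle to lower
    return Q, g

def lm_step(Q, g, mu):
    """One Levenberg-Marquardt weight update: dw = -(Q + mu*I)^{-1} g."""
    return -np.linalg.solve(Q + mu * np.eye(Q.shape[0]), g)
```

Because Q and g are built incrementally, memory usage scales with the number of weights only, rather than with the full Jacobian (patterns × outputs × weights), which is the saving the abstract points to for large pattern sets.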

Similar resources

Fan Improved Algorithm of BP Neural Network Fault Diagnosis Research

Based on the structure of the BP neural network and its training algorithm, three improved variants of the BP algorithm were considered; through analysis and comparison, the Levenberg-Marquardt algorithm was chosen as the optimal improvement because of its faster computation and more accurate judgment. The selected algorithm was then used to train the established BP neural network, and Matlab software was used to...

A New Damping Strategy of Levenberg-Marquardt Algorithm for Multilayer Perceptrons

In this paper, a new adjustment to the damping parameter of the Levenberg-Marquardt algorithm is proposed to save training time and to reduce error oscillations. The damping parameter of the Levenberg-Marquardt algorithm switches the update between a gradient-descent step and a Gauss-Newton step; it also affects training speed and induces error oscillations when the decay rate is fixed. Therefore, our...
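For context, the fixed-rate damping adjustment that such strategies refine can be sketched as follows; this is the classic heuristic (increase mu after a failed step, decrease it after a successful one), not the new strategy proposed in that paper, and the names and the factor value are illustrative:

```python
def adjust_damping(sse_old, sse_new, mu, factor=10.0):
    """Classic fixed-factor Levenberg-Marquardt damping rule.
    A successful trial step decreases mu (behaves more like Gauss-Newton);
    a failed step increases mu (behaves more like gradient descent).
    The fixed 'factor' is the decay rate whose choice drives the error
    oscillations mentioned above."""
    if sse_new < sse_old:
        return mu / factor, True     # accept the step
    return mu * factor, False        # reject the step, retry with larger mu
```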

A New Cuckoo Search Based Levenberg-Marquardt (CSLM) Algorithm

The back-propagation neural network (BPNN) algorithm is a widely used technique for training artificial neural networks. It is also a very popular optimization procedure for finding optimal weights during training. However, traditional back propagation optimized with the Levenberg-Marquardt training algorithm has some drawbacks, such as getting stuck in local minima and network stagnancy. This...

Speeding up the Training of Lattice–Ladder Multilayer Perceptrons

A lattice–ladder multilayer perceptron (LLMLP) is an appealing structure for advanced signal processing in the sense that it is nonlinear, possesses an infinite impulse response, and its stability is simple to monitor during training. However, even a moderate implementation of LLMLP training is hindered by the large amount of storage and computation power that must be allocated. In this paper we deal with the...

Improved Neural Network Training Algorithm for Classification of Compressed and Uncompressed Images

To manage data in a smart card's limited memory, which contains medical and biometric images, image compression is used. For image retrieval, the classification algorithm must be able to search for and locate an image efficiently in the compressed domain. This study proposes a novel training algorithm for the Multi-Layer Perceptron Neural Network (MLP-NN) to classify compressed images. M...

Journal:
  • IEEE Transactions on Neural Networks

Volume 21, Issue 6

Pages  -

Publication date 2010